On Blackbox Backpropagation and Jacobian Sensing

Authors

  • Krzysztof Choromanski
  • Vikas Sindhwani
Abstract

From a small number of calls to a given “blackbox” on random input perturbations, we show how to efficiently recover its unknown Jacobian, or estimate the left action of its Jacobian on a given vector. Our methods are based on a novel combination of compressed sensing and graph coloring techniques, and provably exploit structural prior knowledge about the Jacobian such as sparsity and symmetry while being noise robust. We demonstrate efficient backpropagation through noisy blackbox layers in a deep neural net, improved data-efficiency in the task of linearizing the dynamics of a rigid body system, and the generic ability to handle a rich class of input-output dependency structures in Jacobian estimation problems.
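The recovery primitive behind this setup is easy to sketch. The code below is a minimal, self-contained illustration of the compressed-sensing half of the idea only, under assumptions made for the example (Gaussian probe directions, a sparse Jacobian, ISTA as the l1 solver); the function estimate_sparse_jacobian and all of its parameters are hypothetical names for this sketch, and the paper's actual method additionally employs graph coloring and exploits priors such as symmetry.

    import numpy as np

    def estimate_sparse_jacobian(f, x, num_calls=40, eps=1e-5, lam=1e-3, iters=500):
        """Sketch: estimate a sparse m x n Jacobian of a blackbox f at x from
        num_calls forward evaluations on random perturbations, by solving an
        l1-regularized least-squares problem with ISTA."""
        n = x.shape[0]
        f0 = f(x)
        V = np.random.randn(num_calls, n) / np.sqrt(n)           # probe directions
        Y = np.stack([(f(x + eps * v) - f0) / eps for v in V])   # row i ~ (J v_i)^T
        # Solve min_W 0.5 * ||V W - Y||_F^2 + lam * ||W||_1, where W = J^T.
        L = np.linalg.norm(V, 2) ** 2                            # Lipschitz constant
        W = np.zeros((n, f0.shape[0]))
        for _ in range(iters):
            W -= V.T @ (V @ W - Y) / L                           # gradient step
            W = np.sign(W) * np.maximum(np.abs(W) - lam / L, 0)  # soft-threshold
        return W.T                                               # m x n estimate

    # Toy check on a sparse linear blackbox, whose Jacobian is the matrix itself;
    # 40 calls suffice here even though the input dimension is 100.
    rng = np.random.default_rng(0)
    J_true = np.zeros((5, 100))
    J_true[rng.integers(0, 5, 15), rng.integers(0, 100, 15)] = rng.normal(size=15)
    J_hat = estimate_sparse_jacobian(lambda z: J_true @ z, np.zeros(100))
    print(np.max(np.abs(J_hat - J_true)))                        # small on success

In this regime the number of calls needed grows with the sparsity level of the Jacobian (up to logarithmic factors) rather than linearly with the input dimension, which is what makes the few-calls claim of the abstract plausible.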

Similar Resources

The Calculus of Jacobian Adaptation

For many problems, the correct behavior of a model depends not only on its input-output mapping but also on properties of its Jacobian matrix, the matrix of partial derivatives of the model’s outputs with respect to its inputs. This paper introduces the J-prop algorithm, an efficient general method for computing the exact partial derivatives of a variety of simple functions of the Jacobian of a...
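For reference, the Jacobian in question is the matrix of first-order partials of the model's outputs with respect to its inputs; a representative "simple function of the Jacobian" that such training might differentiate is the squared Frobenius norm (an illustrative choice for this note, not necessarily the one the paper treats):

    J(x)_{ij} = \frac{\partial f_i(x)}{\partial x_j},
    \qquad
    \mathcal{L}_{\mathrm{reg}}(\theta)
      = \lVert J_\theta(x) \rVert_F^2
      = \sum_{i,j} \left( \frac{\partial f_i(x;\theta)}{\partial x_j} \right)^2.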

Modified Kalman filter based method for training state-recurrent multilayer perceptrons

Kalman filter based training algorithms for recurrent neural networks provide a clever alternative to standard backpropagation through time. However, these algorithms do not take into account the optimization of the hidden state variables of the recurrent network. In addition, their formulation requires Jacobian evaluations over the entire network, adding to their computational complexity. In th...
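For background, the standard extended-Kalman-filter weight update that such algorithms build on treats the weight vector w_t as the state to be estimated; with H_t the Jacobian of the network output with respect to the weights, d_t the target, y_t the network output, and P_t, Q_t, R_t the usual covariances, one step reads (the textbook global EKF recursion, not this paper's modified variant):

    K_t = P_t H_t^{\top} \left( H_t P_t H_t^{\top} + R_t \right)^{-1},
    \qquad
    w_{t+1} = w_t + K_t \left( d_t - y_t \right),
    \qquad
    P_{t+1} = P_t - K_t H_t P_t + Q_t.

The appearance of H_t is exactly the "Jacobian evaluations over the entire network" that the abstract cites as a source of computational cost.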

A Layer-by-Layer Levenberg-Marquardt algorithm for Feedforward Multilayer Perceptron

The error backpropagation (EBP) algorithm for training feedforward multilayer perceptrons (FMLPs) has been used in many applications because it is simple and easy to implement. However, its gradient descent method prevents the EBP algorithm from converging fast. To overcome the slow convergence of the EBP algorithm, second-order methods have been adopted. The Levenberg-Marquardt (LM) algorithm is estimated to...
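For context, the update that the layer-by-layer scheme decomposes is the standard Levenberg-Marquardt step, a damped Gauss-Newton iteration built from the Jacobian J of the error vector e with respect to the weights w:

    \Delta w = -\left( J^{\top} J + \mu I \right)^{-1} J^{\top} e,

where a large damping parameter \mu recovers (scaled) gradient descent and a small \mu approaches the Gauss-Newton step; this interpolation is what buys LM its faster convergence over plain EBP.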

Unifying Adversarial Training Algorithms with Flexible Deep Data Gradient Regularization

We present DataGrad, a general back-propagation style training procedure for deep neural architectures that regularizes learning with a deep Jacobian-based penalty. It can be viewed as a deep extension of the layerwise contractive auto-encoder penalty. More importantly, it unifies previous proposals for adversarial training of deep neural nets – this list includes directly modifying the gradient, ...
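Schematically, training procedures of this kind augment the task loss with a penalty on the gradient of the loss with respect to the data; a generic form of such an objective (an illustration of the family, not DataGrad's exact formulation) is

    \mathcal{L}(\theta)
      = \mathbb{E}_{(x,y)} \left[ \ell\big(f_\theta(x), y\big)
        + \lambda \, \big\lVert \nabla_x \, \ell\big(f_\theta(x), y\big) \big\rVert \right],

so that minimizing it flattens the loss surface around each training point, which is the mechanism connecting such penalties to adversarial robustness.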

Levenberg-Marquardt Algorithm for Karachi Stock Exchange Share Rates Forecasting

Financial forecasting is an example of a signal processing problem. A number of ways to train the network are available. We have used the Levenberg-Marquardt algorithm for error back-propagation weight adjustment. Pre-processing has rescaled large-scale variation in the data down to a small scale, reducing the variation of the training data. Keywords— Gradient descent method, Jacobian matri...

Publication date: 2017